probabilistic transducer
Adaptive Mixture of Probabilistic Transducers
We introduce and analyze a mixture model for supervised learning of probabilistic transducers. We devise an online learning algorithm that efficiently infers the structure and estimates the parameters of each model in the mixture. Theoretical analysis and comparative simulations indicate that the learning algorithm tracks the best model from an arbitrarily large (possibly infinite) pool of models. We also present an application of the model for inducing a noun phrase recognizer.
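The online tracking described above can be sketched with a standard Bayesian (multiplicative-weights) mixture update, one common way such mixtures of predictors are maintained; the function names and the per-model output distributions below are illustrative assumptions, not the paper's notation or its exact algorithm:

```python
def mixture_update(weights, model_probs, observed):
    """Multiplicative (Bayesian) update of mixture weights.

    weights: current weight per model (sums to 1)
    model_probs: for each model, a dict mapping output symbol -> probability
    observed: the output symbol actually observed

    Each model's weight is scaled by the probability it assigned to the
    observed symbol, then the weights are renormalized. Models that keep
    predicting well accumulate weight, so the mixture tracks the best model.
    """
    new = [w * p[observed] for w, p in zip(weights, model_probs)]
    z = sum(new)
    return [w / z for w in new]

def mixture_predict(weights, model_probs, symbol):
    """Mixture probability of a symbol: weighted average over models."""
    return sum(w * p[symbol] for w, p in zip(weights, model_probs))
```

For example, starting from equal weights over a model that assigns the observed symbol probability 0.9 and one that assigns it 0.1, a single update moves the weights to 0.9 and 0.1, already concentrating on the better predictor.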
The brain as a probabilistic transducer: an evolutionarily plausible network architecture for knowledge representation, computation, and behavior
Halpern, Joseph Y., Lotem, Arnon
We offer a general theoretical framework for brain and behavior that is evolutionarily and computationally plausible. The brain in our abstract model is a network of nodes and edges. Although it has some similarities to standard neural network models, as we show, there are some significant differences. Both nodes and edges in our network have weights and activation levels. They act as probabilistic transducers that use a set of relatively simple rules to determine how activation levels and weights are affected by input, generate output, and affect each other. We show that these simple rules enable a learning process that allows the network to represent increasingly complex knowledge, and simultaneously to act as a computing device that facilitates planning, decision-making, and the execution of behavior. By specifying the innate (genetic) components of the network, we show how evolution could endow the network with initial adaptive rules and goals that are then enriched through learning. We demonstrate how the developing structure of the network (which determines what the brain can do and how well) is critically affected by the co-evolved coordination between the mechanisms affecting the distribution of data input and those determining the learning parameters (used in the programs run by nodes and edges). Finally, we consider how the model accounts for various findings in the field of learning and decision making, how it can address some challenging problems in mind and behavior, such as those related to setting goals and self-control, and how it can help understand some cognitive disorders.
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- North America > United States > New York > New York County > New York City (0.04)
- Health & Medicine > Therapeutic Area > Neurology (1.00)
- Health & Medicine > Consumer Health (0.93)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology (0.92)
- Leisure & Entertainment (0.67)
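To make the node-and-edge picture above concrete, here is a toy sketch of a single probabilistic-transducer node. It is entirely illustrative: the class name, the sigmoid firing probability, and the Hebbian-style weight rule are assumptions, since the abstract does not specify the model's actual update rules.

```python
import math
import random

class ProbNode:
    """Toy probabilistic transducer node: it has weighted input edges,
    an activation level (here, a firing probability), and a simple local
    rule by which input affects its weights."""

    def __init__(self, n_inputs, lr=0.1, seed=0):
        self.w = [0.0] * n_inputs   # edge weights
        self.lr = lr                # learning rate
        self.rng = random.Random(seed)

    def activation(self, inputs):
        """Firing probability: sigmoid of the weighted input sum."""
        s = sum(w * x for w, x in zip(self.w, inputs))
        return 1.0 / (1.0 + math.exp(-s))

    def step(self, inputs):
        """Probabilistically fire, then apply a Hebbian-style update:
        strengthen edges from active inputs when the node fires,
        weaken them slightly when it does not."""
        fired = self.rng.random() < self.activation(inputs)
        for i, x in enumerate(inputs):
            self.w[i] += self.lr * x * (1.0 if fired else -0.1)
        return fired
```

The point of the sketch is only that very simple local rules at each node suffice to couple activation, output, and weight change, which is the mechanism the abstract builds on.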
Shared Context Probabilistic Transducers
Bengio, Yoshua, Bengio, Samy, Isabelle, Jean-François, Singer, Yoram
Recently, a model for supervised learning of probabilistic transducers represented by suffix trees was introduced. However, this algorithm tends to build very large trees, requiring very large amounts of computer memory. In this paper, we propose a new, more compact transducer model in which one shares the parameters of distributions associated with contexts yielding similar conditional output distributions. We illustrate the advantages of the proposed algorithm with comparative experiments on inducing a noun phrase recognizer.
- North America > Canada > Quebec > Montreal (0.05)
- North America > United States > New York (0.04)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
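The parameter-sharing idea in the abstract above can be sketched minimally as follows. This is an illustration under assumed names, not the paper's algorithm: in particular, the real method decides online which contexts should share a distribution, which this sketch omits by taking the sharing as given.

```python
class SuffixTreeTransducer:
    """Minimal suffix-tree probabilistic transducer with shared output
    distributions. Contexts (input-history suffixes) are stored as tuples;
    several contexts may point at the SAME distribution object. That reuse
    is the compaction idea: contexts with similar conditional output
    distributions share one parameter set instead of each storing its own.
    """

    def __init__(self, default):
        self.default = default   # fallback output distribution
        self.table = {}          # context tuple -> (possibly shared) dist

    def add_context(self, context, dist):
        self.table[tuple(context)] = dist

    def predict(self, history):
        """Predict with the longest stored suffix of the history."""
        h = tuple(history)
        for i in range(len(h)):          # i = 0 tries the longest suffix
            if h[i:] in self.table:
                return self.table[h[i:]]
        return self.default
```

For instance, in noun-phrase tagging the contexts `('the',)` and `('a',)` yield very similar next-tag distributions, so both can be mapped to one shared distribution object, halving the parameters stored for them.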